    Extremal limits and black hole entropy

    Taking the extremal limit of a non-extremal Reissner-Nordström black hole (by externally varying the mass or charge), the region between the inner and outer event horizons experiences an interesting fate: while this region is absent in the extremal case, it does not disappear in the extremal limit but rather approaches a patch of AdS_2 × S^2. In other words, the approach to extremality is not continuous, as the non-extremal Reissner-Nordström solution splits into two spacetimes at extremality: an extremal black hole and a disconnected AdS space. We suggest that the unusual nature of this limit may help in understanding the entropy of extremal black holes. Comment: 10 pages, 3 figures. Minor corrections and added references.
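    The AdS_2 × S^2 patch mentioned above can be seen in a standard scaling limit of the extremal Reissner-Nordström metric (a textbook computation, not this paper's specific construction):

    ```latex
    % Extremal RN (M = Q, units G = 1):
    ds^2 = -\left(1 - \frac{Q}{r}\right)^2 dt^2
           + \left(1 - \frac{Q}{r}\right)^{-2} dr^2 + r^2\, d\Omega_2^2 .

    % Zoom in on the horizon: set r = Q + \lambda\rho,\ t = \tau/\lambda,
    % and take \lambda \to 0:
    ds^2 \;\to\; -\frac{\rho^2}{Q^2}\, d\tau^2
                 + \frac{Q^2}{\rho^2}\, d\rho^2 + Q^2\, d\Omega_2^2 ,
    ```

    which is AdS_2 of radius Q times a two-sphere of radius Q.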

    Burns and long-term infectious disease morbidity: A population-based study

    Background: There is a growing body of evidence indicating that serious injury suppresses immune function, predisposing individuals to infectious complications. With recent evidence showing long-term immune dysfunction after less severe burns, this study aimed to investigate post-burn infectious disease morbidity and assess whether burn patients have increased long-term hospital use for infectious diseases. Methods: A population-based longitudinal study using linked hospital morbidity and death data from Western Australia for all persons hospitalised for a first burn (n=30,997) in 1980-2012. A frequency-matched non-injury comparison cohort was randomly selected from Western Australia's birth registrations and electoral roll (n=123,399). Direct standardisation was used to assess temporal trends in infectious disease admissions. Crude annual admission rates and length of stay for infectious diseases were calculated. Multivariate negative binomial and Cox proportional hazards regression modelling were used to generate adjusted incidence rate ratios (IRR) and hazard ratios (HR), respectively. Results: After adjustment for demographic factors and pre-existing health status, the burn cohort had twice as many admissions (IRR, 95% confidence interval (CI): 2.04, 1.98-2.22) and 3.5 times the number of days in hospital (IRR, 95%CI: 3.46, 3.05-3.92) for infectious diseases as the uninjured cohort. Higher rates of infectious disease admissions were found for severe (IRR, 95%CI: 2.37, 1.89-2.97) and minor burns (IRR, 95%CI: 2.22, 2.11-2.33). Burns were associated with significantly increased incident admissions: 0-30 days (HR, 95%CI: 5.18, 4.15-6.48); 30 days-1 year (HR, 95%CI: 1.69, 1.53-1.87); 1-10 years (HR, 95%CI: 1.40, 1.33-1.47); >10 years (HR, 95%CI: 1.16, 1.08-1.24). Respiratory, skin and soft tissue, and gastrointestinal infections were the most common.
The burn cohort had a 1.75 (95%CI: 1.37-2.25) times greater rate of mortality caused by infectious diseases during the 5-year period after discharge than the uninjured cohort. Conclusions: These findings suggest that burns have long-lasting effects on the immune system and its function. The increase in infectious disease in three different epithelial tissues in the burn cohort suggests there may be common underlying pathophysiology. Further research to understand the underlying mechanisms is required to inform clinical interventions that mitigate infectious disease after burn and improve patient outcomes.
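    As an illustration of the rate comparisons reported above, a crude incidence rate ratio with a Wald-type confidence interval can be computed in a few lines. This is a simplified sketch with made-up numbers; the study itself used adjusted negative binomial regression, and `rate_ratio` is a hypothetical helper name.

    ```python
    import math

    def rate_ratio(events_exposed, pyears_exposed,
                   events_unexposed, pyears_unexposed, z=1.96):
        """Crude incidence rate ratio with a Wald-type 95% CI on the log scale."""
        irr = (events_exposed / pyears_exposed) / (events_unexposed / pyears_unexposed)
        se_log = math.sqrt(1 / events_exposed + 1 / events_unexposed)
        lo = math.exp(math.log(irr) - z * se_log)
        hi = math.exp(math.log(irr) + z * se_log)
        return irr, lo, hi

    # Illustrative numbers only (not the study's data): 200 admissions over
    # 1000 person-years in the burn cohort vs 100 over 1000 in the comparison cohort.
    irr, lo, hi = rate_ratio(200, 1000, 100, 1000)
    print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # IRR = 2.00 (95% CI 1.57-2.54)
    ```

    The adjusted models in the paper additionally control for demographics and pre-existing health status, which a crude ratio like this cannot do.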

    Puma predation on radiocollared and uncollared bighorn sheep

    Background: We used Global Positioning System (GPS) data from radiocollared pumas (Puma concolor) to identify kill sites of pumas preying upon an endangered population of bighorn sheep (Ovis canadensis) in southern California. Our aims were to test whether pumas selected radiocollared versus uncollared bighorn sheep, and to identify patterns of movement before, during, and after kills. Findings: Three pumas killed 23 bighorn sheep over the course of the study, but they did not preferentially prey on marked (radiocollared) versus unmarked bighorn sheep. Predation occurred primarily during crepuscular and nighttime hours, and 22 kill sites were identified by the occurrence of 2 or more consecutive puma GPS locations (a cluster) within 200 m of each other at 1900, 0000, and 0600 h. Conclusion: We tested the "conspicuous individual hypothesis" and found that there was no difference in puma predation upon radiocollared and uncollared bighorn sheep. Pumas tended to move long distances before and after kills, but their movement patterns immediately post-kill were much more restricted. Researchers can exploit this behaviour to identify puma kill sites and investigate prey selection by designing studies that detect puma locations that are spatially clustered between dusk and dawn.
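    The dusk-to-dawn cluster rule from the Findings can be sketched as follows. This is a hypothetical implementation, not the authors' code: it flags runs of consecutive fixes near the first fix of the run (a simplification of "within 200 m of each other"), and the function names and synthetic track are illustrative.

    ```python
    import math

    def haversine_m(p, q):
        """Great-circle distance in metres between two (lat, lon) points in degrees."""
        R = 6371000.0
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        dlat, dlon = lat2 - lat1, lon2 - lon1
        a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
        return 2 * R * math.asin(math.sqrt(a))

    def candidate_kill_clusters(fixes, radius_m=200.0, min_fixes=2):
        """Return (start, end) index ranges of >= min_fixes consecutive fixes
        within radius_m of the run's first fix.

        fixes: list of (lat, lon) tuples from the dusk-to-dawn schedule
        (e.g. the 1900, 0000, and 0600 h locations).
        """
        clusters, i = [], 0
        while i < len(fixes):
            j = i + 1
            while j < len(fixes) and haversine_m(fixes[i], fixes[j]) <= radius_m:
                j += 1
            if j - i >= min_fixes:
                clusters.append((i, j - 1))
            i = j
        return clusters

    # Hypothetical track: two tight fixes (~50 m apart) followed by a long move.
    track = [(33.7000, -116.3000), (33.70045, -116.3000), (33.9000, -116.5000)]
    print(candidate_kill_clusters(track))  # [(0, 1)]
    ```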

    Modeling the transmission and thermal emission in a pupil image behind the Keck II adaptive optics system

    The design and performance of astronomical instruments depend critically on the total system throughput as well as the background emission from the sky and instrumental sources. In designing a pupil stop for background-limited imaging, one seeks to balance throughput and background rejection to optimize measurement signal-to-noise ratios. Many sources affect transmission and emission in infrared imaging behind the Keck Observatory’s adaptive optics systems, such as telescope segments, segment gaps, secondary support structure, and AO bench optics. Here we describe an experiment, using the pupil-viewing mode of NIRC2, to image the pupil plane as a function of wavelength. We are developing an empirical model of throughput and background emission as a function of position in the pupil plane. This model will be used in part to inform the optimal design of cold pupils in future instruments, such as the new imaging camera for OSIRIS.

    Dynamical compactification from de Sitter space

    We show that D-dimensional de Sitter space is unstable to the nucleation of non-singular geometries containing spacetime regions with different numbers of macroscopic dimensions, leading to a dynamical mechanism of compactification. These and other solutions to Einstein gravity with flux and a cosmological constant are constructed by performing a dimensional reduction under the assumption of q-dimensional spherical symmetry in the full D-dimensional geometry. In addition to the familiar black holes, black branes, and compactification solutions, we identify a number of new geometries, some of which are completely non-singular. The dynamical compactification mechanism populates lower-dimensional vacua very differently from false vacuum eternal inflation, which occurs entirely in four dimensions. We outline the phenomenology of the nucleation rates, finding that the dimensionality of the vacuum plays a key role and that among vacua of the same dimensionality, the rate is highest for smaller values of the cosmological constant. We consider the cosmological constant problem and propose a novel model of slow-roll inflation that is triggered by the compactification process. Comment: RevTeX. 41 pages with 24 embedded figures. Minor corrections and added references.

    BPS Domain Wall Junctions in Infinitely Large Extra Dimensions

    We consider models of scalar fields coupled to gravity which are higher-dimensional generalizations of four-dimensional supergravity. We use these models to describe domain wall junctions in an anti-de Sitter background. We derive Bogomolnyi equations for the scalar fields from which the walls are constructed and for the metric. From these equations a BPS-like formula for the junction energy can be derived. We demonstrate that such junctions localize gravity in the presence of more than one uncompactified extra dimension. Comment: 17 pages, uses RevTeX, new references added.

    Estimating parameters for probabilistic linkage of privacy-preserved datasets.

    Background: Probabilistic record linkage is a process used to bring together person-based records from within the same dataset (de-duplication) or from disparate datasets using pairwise comparisons and matching probabilities. The linkage strategy and associated match probabilities are often estimated through investigations into data quality and manual inspection. However, as privacy-preserved datasets comprise encrypted data, such methods are not possible. In this paper, we present a method for estimating the probabilities and threshold values for probabilistic privacy-preserved record linkage using Bloom filters. Methods: Our method was tested through a simulation study using synthetic data, followed by an application using real-world administrative data. Synthetic datasets were generated with error rates from zero to 20%. Our method was used to estimate parameters (probabilities and thresholds) for de-duplication linkages. Linkage quality was determined by F-measure. Each dataset was privacy-preserved using separate Bloom filters for each field. Match probabilities were estimated using the expectation-maximisation (EM) algorithm on the privacy-preserved data. Threshold cut-off values were determined by an extension to the EM algorithm allowing linkage quality to be estimated for each possible threshold. De-duplication linkages of each privacy-preserved dataset were performed using both estimated and calculated probabilities. Linkage quality using the F-measure at the estimated threshold values was also compared to the highest F-measure. Three large administrative datasets were used to demonstrate the applicability of the probability and threshold estimation technique on real-world data. Results: Linkage of the synthetic datasets using the estimated probabilities produced an F-measure that was comparable to the F-measure using calculated probabilities, even with up to 20% error.
Linkage of the administrative datasets using estimated probabilities produced an F-measure that was higher than the F-measure using calculated probabilities. Further, the threshold estimation yielded results for F-measure that were only slightly below the highest possible for those probabilities. Conclusions: The method appears highly accurate across a spectrum of datasets with varying degrees of error. As there are few alternatives for parameter estimation, the approach is a major step towards providing a complete operational approach for probabilistic linkage of privacy-preserved datasets.
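    The match (m) and non-match (u) probabilities that EM estimates feed into Fellegi-Sunter-style field weights, which are summed over fields to score a candidate pair. A minimal sketch, with hypothetical m and u values rather than the paper's estimates:

    ```python
    import math

    def agreement_weights(m, u):
        """Field agreement/disagreement weights from the match probability m
        (P(fields agree | true match)) and non-match probability u
        (P(fields agree | non-match)), as in Fellegi-Sunter linkage."""
        agree = math.log2(m / u)
        disagree = math.log2((1 - m) / (1 - u))
        return agree, disagree

    # Hypothetical parameters an EM fit might return for a surname field:
    # true matches agree 95% of the time; random pairs agree 0.1% of the time.
    a, d = agreement_weights(m=0.95, u=0.001)
    print(f"agreement weight {a:.2f}, disagreement weight {d:.2f}")
    ```

    A pair whose summed weights across all fields exceed the chosen threshold is declared a match; the paper's contribution is estimating these m, u values and the threshold when the fields themselves are only visible as Bloom filters.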

    Classical Stabilization of Homogeneous Extra Dimensions

    If spacetime possesses extra dimensions of size and curvature radii much larger than the Planck or string scales, the dynamics of these extra dimensions should be governed by classical general relativity. We argue that in general relativity, it is highly nontrivial to obtain solutions where the extra dimensions are static and are dynamically stable to small perturbations. We also illustrate that intuition on equilibrium and stability built up from non-gravitational physics can be highly misleading. For all static, homogeneous solutions satisfying the null energy condition, we show that the Ricci curvature of space must be nonnegative in all directions. Much of our analysis focuses on a class of spacetime models where space consists of a product of homogeneous and isotropic geometries. A dimensional reduction of these models is performed, and their stability to perturbations that preserve the spatial symmetries is analyzed. We conclude that the only physically realistic examples of classically stabilized large extra dimensions are those in which the extra-dimensional manifold is positively curved. Comment: 25 pages; minor changes, improved references.

    Evaluating privacy-preserving record linkage using cryptographic long-term keys and multibit trees on large medical datasets.

    Background: Integrating medical databases from different sources by record linkage is a powerful technique increasingly used in medical research. Under many jurisdictions, unique personal identifiers needed for linking the records are unavailable. Since sensitive attributes, such as names, have to be used instead, privacy regulations usually demand encrypting these identifiers. The corresponding set of techniques for privacy-preserving record linkage (PPRL) has received widespread attention. One recent method is based on Bloom filters. Due to superior resilience against cryptographic attacks, composite Bloom filters (cryptographic long-term keys, CLKs) are considered best practice for privacy in PPRL. Real-world performance of these techniques on large-scale data has been unknown until now. Methods: Using a large subset of Australian hospital admission data, we tested the performance of an innovative PPRL technique (CLKs using multibit trees) against a gold standard derived from clear-text probabilistic record linkage. Linkage time and linkage quality (recall, precision and F-measure) were evaluated. Results: Clear-text probabilistic linkage resulted in marginally higher precision and recall than CLKs. PPRL required more computing time, but 5 million records could still be de-duplicated within one day. However, the PPRL approach required fine-tuning of parameters. Conclusions: We argue that the increased privacy of PPRL comes at the price of small losses in precision and recall and a large increase in computational burden and setup time. These costs seem acceptable in most applied settings, but they have to be considered in the decision to apply PPRL. Further research on the optimal automatic choice of parameters is needed.
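    A cryptographic long-term key hashes q-grams from several identifier fields into a single Bloom filter, and candidate record pairs are then scored with the Dice coefficient. The sketch below is illustrative only: the parameters, double-hashing scheme, and field choices are assumptions, not the exact construction evaluated in the paper.

    ```python
    import hashlib

    def clk(fields, n_bits=1024, k=20, q=2):
        """Build a composite Bloom filter (CLK) as an int bit array:
        q-grams from all fields are hashed into one shared filter."""
        bits = 0
        for field in fields:
            padded = f"_{field.lower()}_"
            for i in range(len(padded) - q + 1):
                gram = padded[i:i + q].encode()
                d = hashlib.sha256(gram).digest()
                h1 = int.from_bytes(d[:8], "big")
                h2 = int.from_bytes(d[8:16], "big") | 1  # odd step for double hashing
                for j in range(k):
                    bits |= 1 << ((h1 + j * h2) % n_bits)
        return bits

    def dice(a, b):
        """Dice coefficient between two bit arrays stored as ints."""
        inter = bin(a & b).count("1")
        return 2 * inter / (bin(a).count("1") + bin(b).count("1"))

    r1 = clk(["john", "smith", "1970-01-01"])
    r2 = clk(["jon", "smith", "1970-01-01"])   # small typo in first name
    r3 = clk(["mary", "jones", "1985-12-31"])
    print(dice(r1, r2) > dice(r1, r3))  # True: similar records score higher
    ```

    Because q-gram errors only flip a few bits, the Dice score degrades gracefully with typos, which is what makes threshold-based matching (and the multibit-tree search evaluated in the paper) workable on encrypted identifiers.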
